Moshe Gur: On the unity of perception - How does the brain integrate activity evoked at different cortical loci?
Talk by Moshe Gur, Dept. of Biomedical Engineering, Technion, Israel Institute of Technology. Given to the Redwood Center for Theoretical Neuroscience at UC Berkeley.
Audio/Visual: sound, color
Any physical device we know of, including computers, when comparing A to B must send the information to a third point C. I have performed experiments in three modalities, somatosensory, auditory, and visual, in which two different loci in the primary cortex are stimulated, and I argue that this "machine-like" converging hypothesis cannot explain the perceptual results. Thus we must assume a non-converging mechanism whereby the brain, at times, can compare (integrate, process) events that take place at different loci without sending the information to a common target. Once we allow for such a mechanism, many phenomena can be viewed differently. Take, for example, the question of how and where multisensory integration takes place: we perceive a synchronized talking face, yet detailed visual and auditory information are represented at very different brain loci.